Project-Team Sierra
Team, Visitors, External Collaborators
Overall Objectives
Statement
Research Program
Supervised Learning
Unsupervised Learning
Parsimony
Optimization
Application Domains
Applications for Machine Learning
Highlights of the Year
New Software and Platforms
ProxASAGA
object-states-action
New Results
On the Global Convergence of Gradient Descent for Over-parameterized Models using Optimal Transport
Sharp Analysis of Learning with Discrete Losses
Gossip of Statistical Observations using Orthogonal Polynomials
Marginal Weighted Maximum Log-likelihood for Efficient Learning of Perturb-and-Map models
Slice inverse regression with score functions
Constant Step Size Stochastic Gradient Descent for Probabilistic Modeling
Nonlinear Acceleration of Momentum and Primal-Dual Algorithms
Nonlinear Acceleration of Deep Neural Networks
Nonlinear Acceleration of CNNs
Robust Seriation and Applications To Cancer Genomics
Reconstructing Latent Orderings by Spectral Clustering
Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees
Efficient First-order Methods for Convex Minimization: a Constructive Approach
Operator Splitting Performance Estimation: Tight contraction factors and optimal parameter selection
Finite-sample Analysis of M-estimators using Self-concordance
Uniform regret bounds over R^d for the sequential linear regression problem with the square loss
Efficient online algorithms for fast-rate regret bounds under sparsity
Exponential convergence of testing error for stochastic gradient methods
Statistical Optimality of Stochastic Gradient Descent on Hard Learning Problems through Multiple Passes
Central Limit Theorem for stationary Fleming–Viot particle systems in finite spaces
SeaRNN: Improved RNN training through Global-Local Losses
Improved asynchronous parallel optimization analysis for stochastic incremental methods
Asynchronous optimisation for Machine Learning
M*-Regularized Dictionary Learning
Optimal Algorithms for Non-Smooth Distributed Optimization in Networks
Relating Leverage Scores and Density using Regularized Christoffel Functions
Averaging Stochastic Gradient Descent on Riemannian Manifolds
Localized Structured Prediction
Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
Differential Properties of Sinkhorn Approximation for Learning with Wasserstein Distance
Learning with SGD and Random Features
Manifold Structured Prediction
On Fast Leverage Score Sampling and Optimal Learning
Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives
Bilateral Contracts and Grants with Industry
Bilateral Contracts with Industry
Bilateral Grants with Industry
Partnerships and Cooperations
National Initiatives
European Initiatives
International Initiatives
International Research Visitors
Dissemination
Promoting Scientific Activities
Teaching - Supervision - Juries
Popularization
Bibliography
Publications of the year
Inria | Raweb 2018 | Presentation of the Project-Team SIERRA
Section: Partnerships and Cooperations
National Initiatives
Alexandre d'Aspremont: IRIS, PSL “Science des données, données de la science”.